Church–Turing thesis
In computability theory, the Church–Turing thesis (also known as the Church–Turing conjecture, Church's thesis, Church's conjecture, and Turing's thesis) is a combined hypothesis ("thesis") about the nature of functions whose values are effectively calculable, i.e., computable. In simple terms, it states that "everything computable is computable by a Turing machine."
Several attempts were made in the first half of the 20th century to formalize the notion of computability:
- American mathematician Alonzo Church created a method for defining functions called the λ-calculus,
- British mathematician Alan Turing created a theoretical model for a machine that could carry out calculations from inputs,
- Church, along with mathematician Stephen Kleene and logician J. B. Rosser, created a formal definition of a class of functions whose values could be calculated by recursion.
All three computational processes (recursion, the λ-calculus, and the Turing machine) were shown to be equivalent: all three approaches define the same class of functions.[1][2] This has led mathematicians and computer scientists to believe that the concept of computability is accurately characterized by these three equivalent processes. Informally, the Church–Turing thesis states that if some method (algorithm) exists to carry out a calculation, then the same calculation can also be carried out by a Turing machine (as well as by a recursively definable function, and by a function in the λ-calculus).
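To make the equivalence concrete, the following minimal sketch (an illustration in Python, not drawn from the cited papers; all function names are ad hoc) defines the same function, addition, twice: once by recursion on a successor operation and once as the classical λ-calculus term over Church numerals. Both definitions compute the same values.

```python
# Illustrative sketch (not from the cited sources): the same function, addition,
# defined in two of the equivalent formalisms mentioned above.

# 1. Recursion: addition defined by recursion on the successor operation.
def add_recursive(m, n):
    if n == 0:
        return m
    return add_recursive(m, n - 1) + 1   # add(m, n) = succ(add(m, n - 1))

# 2. λ-calculus: a Church numeral encodes n as "apply f to x, n times";
#    addition is the λ-term  λm.λn.λf.λx. m f (n f x).
def church(n):
    return lambda f: lambda x: x if n == 0 else f(church(n - 1)(f)(x))

add_lambda = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(c):
    return c(lambda k: k + 1)(0)  # decode a Church numeral back to an int

assert add_recursive(3, 4) == 7
assert to_int(add_lambda(church(3))(church(4))) == 7
```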
The Church–Turing thesis is a statement that characterizes the nature of computation and cannot be formally proven. Even though the three processes mentioned above proved to be equivalent, the fundamental premise behind the thesis—the notion of what it means for a function to be "effectively calculable" (computable)—is "a somewhat vague intuitive one".[3] Thus, the "thesis" remains a hypothesis.[3]
Although it cannot be formally proven, the Church–Turing thesis now enjoys near-universal acceptance.
Formal statement
- See also: Effectively calculable
Rosser (1939) addresses the notion of "effective computability" as follows: "Clearly the existence of CC and RC [Church's and Rosser's proofs] presupposes a precise definition of 'effective'. 'Effective method' is here used in the rather special sense of a method each step of which is precisely predetermined and which is certain to produce the answer in a finite number of steps."[4] Thus the adverb-adjective "effective" is used in the sense of "1a: producing a decided, decisive, or desired effect", and "syn: capable of producing a result".[5]
In the following, the words "effectively calculable" will mean "produced by any intuitively 'effective' means whatsoever" and "effectively computable" will mean "produced by a Turing machine or equivalent mechanical device". Turing's 1939 "definitions" are virtually the same:
- "†We shall use the expression "computable function" to mean a function calculable by a machine, and we let "effectively calculable" refer to the intuitive idea without particular identification with any one of these definitions." (cf the footnote † in Turing 1939 (his Ordinals paper) in Davis 1965:160).
The thesis can be stated as follows:
- Every effectively calculable function is a computable function.[6]
Turing stated it this way:
- " It was stated ... that 'a function is effectively calculable if its values can be found by some purely mechanical process.' We may take this literally, understanding that by a purely mechanical process one which could be carried out by a machine. The development ... leads to ... an identification of computability † with effective calculability" († is the footnote above, ibid).
History
One of the important problems for logicians in the 1930s was David Hilbert's Entscheidungsproblem, which asked whether there was a mechanical procedure for separating mathematical truths from mathematical falsehoods. The quest required that the notion of "algorithm" or "effective calculability" be pinned down, at least well enough for the work to begin.[7] But from the very outset Alonzo Church's attempts began with a debate that continues to this day.[8] Was the notion of "effective calculability" to be (i) an axiom or axioms in an axiomatic system, (ii) merely a definition that "identified" two or more propositions, (iii) an empirical hypothesis to be verified by observation of natural events, or (iv) just a proposal for the sake of argument (i.e., a "thesis")?
Circa 1930–1952
An axiomatic approach? λ-calculus? Recursion?: In the course of studying the problem, Church and his student Stephen Kleene introduced the notion of λ-definable functions, and they were able to prove that several large classes of functions frequently encountered in number theory were λ-definable.[9] The debate began when Church proposed to Kurt Gödel that one should define the "effectively computable" functions as the λ-definable functions. Gödel, however, was not convinced and called the proposal "thoroughly unsatisfactory".[10] Rather, in correspondence with Church (ca. 1934–35), Gödel proposed axiomatizing the notion of "effective calculability"; indeed, in a 1935 letter to Kleene, Church reported:
- "His [Gödel's] only idea at the time was that it might be possible, in terms of effective calculability as an undefined notion, to state a set of axioms which would embody the generally accepted properties of this notion, and to do something on that basis"[11].
But Gödel offered no further guidance. Eventually he would suggest his (primitive) recursion, modified by Herbrand's suggestion, which he had detailed in his 1934 lectures at Princeton (Kleene and another student, J. B. Rosser, transcribed the notes). But "he did not think that the two ideas could be satisfactorily identified 'except heuristically'".[12]
"Identifying" two equivalent notions to define effective calculability, and a successful proof: Now equipped with the λ-calculus and "general" recursion, Stephen Kleene with help of Church and J. B. Rosser produced proofs (1933, 1935) to show that the two calculi are equivalent. Church subsequently modified his methods to include use of Herbrand–Gödel recursion and then proved (1936) that the Entscheidungsproblem is unsolvable: There is no generalized "effective calculation" (method, algorithm) that can determine whether or not a formula in either the recursive- or λ-calculus is "valid" (more precisely: no method to show that a well formed formula has a "normal form")[13].
Many years later, in a letter to Davis (ca. 1965), Gödel would confess that "he was, at the time of these [1934] lectures, not at all convinced that his concept of recursion comprised all possible recursions".[14] By 1963–64 Gödel would disavow Herbrand–Gödel recursion and the λ-calculus in favor of the Turing machine as the definition of "algorithm" or "mechanical procedure" or "formal system".[15]
A hypothesis leading to a natural law?: In late 1936 Alan Turing's paper (also proving that the Entscheidungsproblem is unsolvable) had not yet appeared. On the other hand, Emil Post's 1936 paper had appeared and was certified as independent of Turing's work.[16] Post strongly disagreed with Church's "identification" of effective computability with the λ-calculus and recursion, stating:
- "Actually the work already done by Church and others carries this identification considerably beyond the working hypothesis stage. But to mask this identification under a definition . . . blinds us to the need of its continual verification."[17].
Rather, he regarded the notion of "effective calculability" as merely a "working hypothesis" that might lead by inductive reasoning to a "natural law" rather than by "a definition or an axiom".[18] This idea was "sharply" criticized by Church.[19]
Thus Post, in his 1936 paper,[11] was also discounting Kurt Gödel's suggestion to Church in 1934–35 that the thesis might be expressed as an axiom or set of axioms.[11]
Turing adds another definition, Rosser equates all three: Within just a short time, Turing's 1936–37 paper "On Computable Numbers, with an Application to the Entscheidungsproblem" appeared. In it he asserted another notion of "effective computability" with the introduction of his a-machines (now known as the Turing machine abstract computational model). And in a proof-sketch added as an "Appendix" to his 1936–37 paper, Turing showed that the classes of functions defined by the λ-calculus and by Turing machines coincided.[20]
In a few years (1939) Turing would propose, like Church and Kleene before him, that his formal definition of a mechanical computing agent was the correct one.[21] Thus, by 1939, both Church (1934) and Turing (1939), neither having knowledge of the other's efforts, had individually proposed that their "formal systems" should be definitions of "effective calculability";[22] neither framed their assertions as theses.
Rosser (1939) formally identified the three notions-as-definitions:
- "All three definitions are equivalent, so it does not matter which one is used."[23]
Kleene proposes Church's Thesis: This left the overt expression of a "thesis" to Kleene. In his 1943 paper, Recursive Predicates and Quantifiers, Kleene proposed his "THESIS I":
- "This heuristic fact [general recursive functions are effectively calculable]...led Church to state the following thesis(22). The same thesis is implicit in Turing's description of computing machines(23).
- "THESIS I. Every effectively calculable function (effectively decidable predicate) is general[24] recursive [Kleene's italics]
- "Since a precise mathematical definition of the term effectively calculable (effectively decidable) has been wanting, we can take this thesis ... as a definition of it..."[25]
- "(22) references Church 1936
- "(23) references Turing 1936–7
Kleene goes on to note that:
- "...the thesis has the character of an hypothesis—a point emphasized by Post and by Church(24). If we consider the thesis and its converse as definition, then the hypothesis is an hypothesis about the application of the mathematical theory developed from the definition. For the acceptance of the hypothesis, there are, as we have suggested, quite compelling grounds."[25]
- "(24) references Post 1936 of Post and Church's Formal definitions in the theory of ordinal numbers, Fund. Math. vol 28 (1936) pp.11–21 (see ref. #2, Davis 1965:286).
Kleene's Church–Turing Thesis: A few years later (1952) Kleene would overtly name, defend, and express the two "theses" and then "identify" them (show equivalence) by use of his Theorem XXX:
- "Heuristic evidence and other considerations led Church 1936 to propose the following thesis.
- Thesis I. Every effectively calculable function (effectively decidable predicate) is general recursive[26].
- Theorem XXX: "The following classes of partial functions are coextensive, i.e. have the same members: (a) the partial recursive functions, (b) the computable functions ..."[27]
- Turing's thesis: "Turing's thesis that every function which would naturally be regarded as computable is computable under his definition, i.e. by one of his machines, is equivalent to Church's thesis by Theorem XXX."[28]
Later developments
An attempt to understand the notion of "effective computability" better led Robin Gandy (Turing's student and friend) in 1980 to analyze machine computation (as opposed to human computation acted out by a Turing machine). Gandy's curiosity about, and analysis of, "cellular automata", "Conway's game of life", "parallelism" and "crystalline automata" led him to propose four "principles (or constraints) ... which it is argued, any machine must satisfy."[29] His most important fourth principle, "the principle of causality", is based on the "finite velocity of propagation of effects and signals; contemporary physics rejects the possibility of instantaneous action at a distance."[30] From these principles and some additional constraints—(1a) a lower bound on the linear dimensions of any of the parts, (1b) an upper bound on the speed of propagation (the velocity of light), (2) discrete progress of the machine, and (3) deterministic behavior—he produces a theorem that "What can be calculated by a device satisfying principles I–IV is computable."[31]
In the late 1990s Wilfried Sieg analyzed Turing's and Gandy's notions of "effective calculability" with the intent of "sharpening the informal notion, formulating its general features axiomatically, and investigating the axiomatic framework".[32] In his 1997 and 2002 work Sieg presents a series of constraints on the behavior of a computor, "a human computing agent who proceeds mechanically"; these constraints reduce to:
- "(B.1) (Boundedness) There is a fixed bound on the number of symbolic configurations a computor can immediately recognize.
- "(B.2) (Boundedness) There is a fixed bound on the number of internal states a computor can be in.
- "(L.1) (Locality) A computor can change only elements of an observed symbolic configuration.
- "(L.2) (Locality) A computor can shift attention from one symbolic configuration to another one, but the new observed configurations must be within a bounded distance of the immediately previously observed configuration.
- "(D) (Determinacy) The immediately recognizable (sub-)configuration determines uniquely the next computation step (and id [instantaneous description] )"; stated another way: "A computor's internal state together with the observed configuration fixes uniquely the next computation step and the next internal state."[33]
The matter remains in active discussion within the academic community[34].
Success of the thesis
Other formalisms (besides recursion, the λ-calculus, and the Turing machine) have been proposed for describing effective calculability/computability. Stephen Kleene (1952) adds to the list the functions "reckonable in the system S1" of Kurt Gödel 1936, and Emil Post's (1943, 1946) "canonical [also called normal] systems".[35] In the 1950s Hao Wang and Martin Davis greatly simplified the one-tape Turing-machine model (see Post–Turing machine). Marvin Minsky expanded the model to two or more tapes and greatly simplified the tapes into "up-down counters", which Melzak and Lambek further evolved into what is now known as the counter machine model. In the late 1960s and early 1970s researchers expanded the counter machine model into the register machine, a close cousin to the modern notion of the computer. Other models include combinatory logic and Markov algorithms. Gurevich adds the pointer machine model of Kolmogorov and Uspensky (1953, 1958): "...they just wanted to ... convince themselves that there is no way to extend the notion of computable function."[36]
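To give a flavor of how spare some of these Turing-equivalent models are, the sketch below (illustrative only; the instruction names and the example program are ad hoc, not taken from Minsky's, Melzak's, or Lambek's papers) interprets a Minsky-style counter machine whose only instructions are "increment a register" and "decrement a register, branching if it is zero".

```python
# Minimal counter-machine interpreter (illustrative sketch, not from the cited sources).
# Instructions: ("inc", r, next)          increment register r, go to instruction `next`
#               ("decjz", r, next, zero)  if r > 0, decrement it and go to `next`,
#                                         otherwise go to `zero`
#               ("halt",)
def run(program, registers):
    pc = 0
    while program[pc][0] != "halt":
        op = program[pc]
        if op[0] == "inc":
            _, r, nxt = op
            registers[r] += 1
            pc = nxt
        else:  # "decjz"
            _, r, nxt, zero = op
            if registers[r] > 0:
                registers[r] -= 1
                pc = nxt
            else:
                pc = zero
    return registers

# Example program: add register 1 into register 0 (r0 := r0 + r1; r1 := 0).
addition = [
    ("decjz", 1, 1, 2),  # 0: if r1 > 0, decrement it and continue; else halt
    ("inc", 0, 0),       # 1: increment r0, loop back to instruction 0
    ("halt",),           # 2
]

print(run(addition, {0: 3, 1: 4}))  # {0: 7, 1: 0}
```

Despite this austerity, counter machines with enough registers can simulate a Turing machine, which is what the equivalence proofs mentioned below establish.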
All these contributions involve proofs that the models are computationally equivalent to the Turing machine; such models are said to be Turing complete. Because all these different attempts at formalizing the concept of "effective calculability/computability" have yielded equivalent results, it is now generally assumed that the Church–Turing thesis is correct. In fact, Gödel (1936) proposed something stronger than this; he observed that there was something "absolute" about the concept of "reckonable in S1":
- "It may also be shown that a function which is computable ['reckonable'] in one of the systems Si, or even in a system of transfinite type, is already computable [reckonable] in S1. Thus the concept 'computable' ['reckonable'] is in a certain definite sense 'absolute', while practically all other familiar metamathematical concepts (e.g. provable, definable, etc.) depend quite essentially on the system to which they are defined"[37]
Variations
The success of the Church–Turing thesis prompted variations of the thesis to be proposed. For example, the Physical Church–Turing thesis (PCTT) states:
- "According to Physical CTT, all physically computable functions are Turing-computable"[38]
The Church–Turing thesis says nothing about the efficiency with which one model of computation can simulate another. It has been proved, for instance, that a (multi-tape) universal Turing machine suffers only a logarithmic slowdown factor in simulating any Turing machine.[39] No such result has been proved in general for an arbitrary but reasonable model of computation. A variation of the Church–Turing thesis that addresses this issue is the Feasibility Thesis[40] or (Classical) Complexity-Theoretic Church–Turing Thesis (SCTT), which is not due to Church or Turing, but rather was realized gradually in the development of complexity theory. It states:[41]
- "A probabilistic Turing machine can efficiently simulate any realistic model of computation."
The word 'efficiently' here means up to polynomial-time reductions. This thesis was originally called Computational Complexity-Theoretic Church–Turing Thesis by Ethan Bernstein and Umesh Vazirani (1997). The Complexity-Theoretic Church–Turing Thesis, then, posits that all 'reasonable' models of computation yield the same class of problems that can be computed in polynomial time. Assuming the conjecture that probabilistic polynomial time (BPP) equals deterministic polynomial time (P), the word 'probabilistic' is optional in the Complexity-Theoretic Church–Turing Thesis. A similar thesis, called the Invariant Thesis, was introduced by Cees F. Slot and Peter van Emde Boas. It states: "Reasonable" machines can simulate each other within a polynomially bounded overhead in time and a constant-factor overhead in space.[42] The thesis originally appeared in a paper at STOC'84, which was the first paper to show that polynomial-time overhead and constant-space overhead could be simultaneously achieved for a simulation of a Random Access Machine on a Turing machine.[43]
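For reference, the logarithmic-slowdown result mentioned earlier in this section (a multi-tape universal Turing machine simulating an arbitrary Turing machine) can be written as a bound; this is a paraphrase of the theorem cited to Arora and Barak above, not a quotation:

$$ T_{U}(x) \;\le\; C_{M} \cdot T_{M}(|x|)\,\log T_{M}(|x|) $$

where $T_{M}(|x|)$ is the number of steps the simulated machine $M$ takes on input $x$, $T_{U}(x)$ is the number of steps the universal machine takes to simulate that run, and the constant $C_{M}$ depends only on $M$ (its alphabet, number of tapes, and number of states), not on the length of the input.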
If production-scale quantum computers can be built,[44] they could invalidate the Complexity-Theoretic Church–Turing Thesis, since it is also conjectured that quantum polynomial time (BQP) is larger than BPP. In other words, there are efficient quantum algorithms that perform tasks that are not known to have efficient probabilistic algorithms; for example, factoring integers. They would not, however, invalidate the original Church–Turing thesis, since a quantum computer can always be simulated by a Turing machine; they would only invalidate the classical Complexity-Theoretic Church–Turing Thesis for efficiency reasons. Consequently, the Quantum Complexity-Theoretic Church–Turing Thesis states:[41]
- "A quantum Turing machine can efficiently simulate any realistic model of computation."
Philosophical implications
The Church–Turing thesis has some profound implications for the philosophy of mind; many of the philosophical interpretations of the thesis, however, involve basic misunderstandings of the thesis statement.[45] B. Jack Copeland states that it is an open empirical question whether there are actual deterministic physical processes that, in the long run, elude simulation by a Turing machine; furthermore, he states that it is an open empirical question whether any such processes are involved in the working of the human brain.[46] There are also some important open questions which cover the relationship between the Church–Turing thesis and physics, and the possibility of hypercomputation. When applied to physics, the thesis has several possible meanings:
- The universe is equivalent to a Turing machine; thus, computing non-recursive functions is physically impossible. This has been termed the Strong Church–Turing thesis and is a foundation of digital physics.
- The universe is not equivalent to a Turing machine (i.e., the laws of physics are not Turing-computable), but incomputable physical events are not "harnessable" for the construction of a hypercomputer. For example, a universe in which physics involves real numbers, as opposed to computable reals, might fall into this category.
- The universe is a hypercomputer, and it is possible to build physical devices to harness this property and calculate non-recursive functions. For example, it is an open question whether all quantum mechanical events are Turing-computable, although it is known that rigorous models such as quantum Turing machines are equivalent to deterministic Turing machines. (They are not necessarily efficiently equivalent; see above.) John Lucas and, more famously, Roger Penrose[47] have suggested that the human mind might be the result of some kind of quantum-mechanically enhanced, "non-algorithmic" computation, although there is no scientific evidence for this proposal.
There are many other technical possibilities which fall outside or between these three categories, but these serve to illustrate the range of the concept.
Non-computable functions
One can formally define functions that are not computable. A well-known example of such a function is the busy beaver function. This function takes an input n and returns the largest number of symbols that a Turing machine with n states can print before halting, when run with no input. Using particular models of Turing machines, researchers have computed the value of this function for small values of n: 0 through 4. Simulations of Turing machines with 5 and 6 states have been performed, but without conclusive results. For higher values, only lower bounds have been given. Finding an upper bound on the busy beaver function is equivalent to solving the halting problem, a problem known to be unsolvable by Turing machines. Since the busy beaver function cannot be computed by Turing machines, the Church–Turing thesis asserts that this function cannot be effectively computed by any method.
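The connection to the halting problem can be made concrete with a short sketch. The code below is illustrative only and is not drawn from the cited sources: it simulates Turing machines step by step, and shows that if a computable upper bound on the busy beaver function existed (here in the closely related "maximum number of steps" variant, supplied as the hypothetical parameter bb_step_bound), halting on blank input would become decidable, which is exactly what cannot happen.

```python
# Illustrative sketch (not from the cited sources). A tiny Turing-machine simulator,
# plus the reduction: if a computable upper bound on the busy beaver function
# (step-count variant) existed, halting on blank input would be decidable.

def simulate(delta, max_steps, start="A", halt="H"):
    """Run a machine given as delta[(state, symbol)] = (write, move, new_state)
    on a blank tape for at most max_steps steps. Return (halted, steps_used)."""
    tape, head, state = {}, 0, start
    for step in range(max_steps):
        if state == halt:
            return True, step
        write, move, state = delta[(state, tape.get(head, 0))]
        tape[head] = write
        head += 1 if move == "R" else -1
    return state == halt, max_steps

def halts_on_blank(delta, n_states, bb_step_bound):
    """Decide halting on blank input, *given* a hypothetical (in fact uncomputable)
    upper bound bb_step_bound(n) on the running time of any halting n-state machine."""
    halted, _ = simulate(delta, bb_step_bound(n_states) + 1)
    return halted

# The 2-state busy beaver champion: halts after 6 steps, having printed four 1s.
bb2 = {
    ("A", 0): (1, "R", "B"), ("A", 1): (1, "L", "B"),
    ("B", 0): (1, "L", "A"), ("B", 1): (1, "R", "H"),
}
print(simulate(bb2, 100))                      # (True, 6)
print(halts_on_blank(bb2, 2, lambda n: 1000))  # True (using a stand-in bound)
```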
Mark Burgin, Eugene Eberbach, Peter Kugel, and other researchers argue that super-recursive algorithms such as inductive Turing machines disprove the Church–Turing thesis. Their argument relies on a definition of algorithm broader than the ordinary one, so that non-computable functions obtained from some inductive Turing machines are called computable. This interpretation of the Church–Turing thesis differs from the interpretation commonly accepted in computability theory, discussed above. The argument that super-recursive algorithms are indeed algorithms in the sense of the Church–Turing thesis has not found broad acceptance within the computability research community.
See also
- Church's thesis in constructive mathematics
- Computability logic
- Computability theory
- Decidability
- History of the Church–Turing thesis
- Hypercomputer
- Super-recursive algorithm
- Church–Turing–Deutsch principle, which states that every physical process can be simulated by a universal computing device
Notes
- ↑ Church 1934:90 footnote in Davis 1965
- ↑ Turing 1936–7 in Davis 1965:149
- ↑ Kleene 1952:317
- ↑ Rosser 1939 in Davis 1965:225
- ↑ Merriam Webster's Ninth New Collegiate Dictionary
- ↑ Gandy (Gandy 1980 in Barwise 1980:123) states it this way: What is effectively calculable is computable. He calls this "Church's Thesis", a peculiar choice of moniker.
- ↑ Davis's commentary before Church 1936, "An Unsolvable Problem of Elementary Number Theory", in Davis 1965:88. Church uses the words "effective calculability" on page 100ff.
- ↑ In his review of Church's Thesis after 70 Years edited by Adam Olszewski et al. (full reference TBD), Peter Smith's criticism of a paper by Murawski and Woleński suggests four "lines" re the status of the Church–Turing Thesis: (1) empirical hypothesis, (2) axiom or theorem, (3) definition, (4) explication. But Smith opines that (4) is indistinguishable from (3); cf Smith (July 11, 2007), Church's Thesis after 70 Years, at http://www.phil.cam.ac.uk/teaching_staff/Smith/godelbook/other/CTT.pdf
- ↑ cf footnote 3 in Church 1936, "An Unsolvable Problem of Elementary Number Theory", in Davis 1965:89
- ↑ Dawson 1997:99
- ↑ Sieg 1997:160
- ↑ Sieg 1997:160 quoting from the 1935 letter written by Church to Kleene, cf Footnote 3 in Gödel 1934 in Davis 1965:44
- ↑ cf Church 1936 in Davis 1965:105ff
- ↑ Davis's commentary before Gödel 1934 in Davis 1965:40
- ↑ For a detailed discussion of Gödel's adoption of Turing's machines as models of computation, see Shagrir date TBD at http://edelstein.huji.ac.il/staff/shagrir/papers/Goedel_on_Turing_on_Computability.pdf
- ↑ cf Editor's footnote to Post 1936 Finite Combinatory Process. Formulation I. at Davis 1965:289.
- ↑ Post 1936 in Davis 1965:291 footnote 8
- ↑ Post 1936 in Davis 1965:291
- ↑ Sieg 1997:171 and 176–7
- ↑ Turing 1936–7 in Davis 1965:263ff
- ↑ Turing 1939 in Davis 1965:160
- ↑ cf Church 1934 in Davis 1965:100, also Turing 1939 in Davis 1965:160
- ↑ italics added, Rosser 1939 in Davis 1965:226
- ↑ An archaic usage of Kleene et al. to distinguish Gödel's (1931) "rekursiv" (a few years later named primitive recursion by Rózsa Péter (cf Gandy 1994 in Herken 1994–5:68)) from the Herbrand–Gödel recursion of 1934, i.e. primitive recursion equipped with the additional mu operator; nowadays mu-recursion is called, simply, "recursion".
- ↑ Kleene 1943 in Davis 1965:274
- ↑ Kleene 1952:300
- ↑ Kleene 1952:376
- ↑ Kleene 1952:376
- ↑ Gandy 1980 in Barwise 1980:123ff
- ↑ Gandy 1980 in Barwise 1980:135
- ↑ Gandy 1980 in Barwise 1980:126
- ↑ Sieg 1998–9 in Sieg–Sommer–Talcott 2002:390ff; also Sieg 1997:154ff
- ↑ In a footnote Sieg breaks Post's 1936 (B) into (B.1) and (B.2) and (L) into (L.1) and (L.2) and describes (D) differently. With respect to his proposed Gandy machine he later adds LC.1, LC.2, GA.1 and GA.2. These are complicated; see Sieg 1998–9 in Sieg–Sommer–Talcott 2002:390ff.
- ↑ A collection of papers can be found at Church’s Thesis after 70 Years edited by Adam Olszewski et al. (full reference TBD). Also a review of this collection by Peter Smith (July 11, 2007) Church’s Thesis after 70 Years at http://www.phil.cam.ac.uk/teaching_staff/Smith/godelbook/other/CTT.pdf
- ↑ Kleene 1952:320
- ↑ Gurevich 1988:2
- ↑ translation of Gödel (1936) by Davis in The Undecidable p. 83, differing in the use of the word 'reckonable' in the translation in Kleene (1952) p. 321
- ↑ Piccinini 2007:101 from Gualtiero Piccinini in Synthese (2007) 154:97–120. http://www.umsl.edu/~piccininig/Computationalism_Church-Turing_Thesis_Church-Turing_Fallacy.pdf
- ↑ Arora, Sanjeev; Barak, Boaz, "Complexity Theory: A Modern Approach", Cambridge University Press, 2009, ISBN 978-0-521-42426-4, section 1.4, "Machines as strings and the universal Turing machine" and 1.7, "Proof of theorem 1.9"
- ↑ http://www.claymath.org/millennium/P_vs_NP/Official_Problem_Description.pdf
- ↑ Phillip Kaye, Raymond Laflamme, Michele Mosca, An introduction to quantum computing, Oxford University Press, 2007, ISBN 019857049X, pp. 5–6
- ↑ Peter van Emde Boas, Machine Models and Simulations, in Handbook of Theoretical Computer Science A, Elsevier, 1990, p. 5
- ↑ C. Slot, P. van Emde Boas, On tape versus core: an application of space efficient perfect hash functions to the invariance of space, STOC, December 1984
- ↑ Liesbeth Venema, Quantum information: Reality check, Nature 450, 175–176 (8 November 2007) doi:10.1038/450175a
- ↑ In particular see the numerous examples (of errors, of misappropriation of the thesis) at the entry in the Stanford Encyclopedia of Philosophy. For a good place to encounter original papers see David J. Chalmers, ed. 2002, Philosophy of Mind: Classical and Contemporary Readings, Oxford University Press, New York.
- ↑ B. Jack Copeland, Computation in Luciano Floridi (ed.), The Blackwell guide to the philosophy of computing and information, Wiley-Blackwell, 2004, ISBN 0631229191, p. 15
- ↑ cf his subchapter "The Church–Turing Thesis" (pp. 47–49) in his chapter "Algorithms and Turing machines" in his 1990 (2nd edition) Emperor's New Mind: Concerning Computers, Minds, and the Laws of Physics, Oxford University Press, Oxford UK. Also see his final chapter, "Where lies the physics of mind?", where, in a subsection, he asserts "The non-algorithmic nature of mathematical insight" (pp. 416–418).
References
- Bernstein, E; Vazirani, U. (1997). "Quantum complexity theory". SIAM Journal on Computing 26 (5): 1411–1473. doi:10.1137/S0097539796300921.
- Blass, Andreas; Yuri Gurevich (2003). "Algorithms: A Quest for Absolute Definitions". Bulletin of European Association for Theoretical Computer Science (81). http://research.microsoft.com/~gurevich/Opera/164.pdf.
- Burgin, Mark (2005). "Super-recursive algorithms". Monographs in computer science. Springer. ISBN 0387955690.
- Church, Alonzo (1932). "A set of Postulates for the Foundation of Logic". Annals of Mathematics 2 (33): 346–366.
- Church, Alonzo (1936). "An Unsolvable Problem of Elementary Number Theory". American Journal of Mathematics (58): 345–363.
- Church, Alonzo (1936). "A Note on the Entscheidungsproblem". Journal of Symbolic Logic (1): 40–41.
- Church, Alonzo (1941). The Calculi of Lambda-Conversion. Princeton: Princeton University Press.
- Cooper, S. B.; Odifreddi, P. (2003). "Incomputability in Nature". In S. B. Cooper & S. S. Goncharov. Computability and Models: Perspectives East and West. Kluwer Academic/Plenum Publishers. pp. 137–160.
- Martin Davis, ed (1965). The Undecidable, Basic Papers on Undecidable Propositions, Unsolvable Problems And Computable Functions. New York: Raven Press. Includes original papers by Gödel, Church, Turing, Rosser, Kleene, and Post mentioned in this section.
- Eberbach, E.; Wegner, P. (October 2003). "Beyond Turing Machines". Bulletin of the European Association for Theoretical Computer Science (81): 279–304.
- Gandy, Robin (1980). "Church's Thesis and the Principles for Mechanisms". In H.J. Barwise, H.J. Keisler, and K. Kunen. The Kleene Symposium. North-Holland Publishing Company. pp. 123–148.
- Gandy, Robin (1994–5). Rolf Herken. ed. The Universal Turing Machine: A Half-Century Survey. Wien/New York: Springer-Verlag. pp. 51ff. ISBN 3-211-82637-8.
- Gödel, Kurt (1965) [1934]. "On Undecidable Propositions of Formal Mathematical Systems". In Davis, M.. The Undecidable. Kleene and Rosser (lecture note-takers); Institute for Advanced Study (lecture sponsor). New York: Raven Press.
- Gödel, Kurt (1936). "On The Length of Proofs" (in German). Ergebnisse eines mathematischen Kolloquiums (Heft) (7): 23–24. Cited by Kleene (1952) as "Über die Länge von Beweisen", in Ergebnisse eines math. Koll., etc.
- Gurevich, Yuri (June 1988). "On Kolmogorov Machines and Related Issues". Bulletin of European Association for Theoretical Computer Science (35): 71–82.
- Gurevich, Yuri (July 2000). "Sequential Abstract State Machines Capture Sequential Algorithms". ACM Transactions on Computational Logic 1 (1): 77–111. doi:10.1145/343369.343384. http://research.microsoft.com/~gurevich/Opera/141.pdf.
- Herbrand, Jacques (1932). "Sur la non-contradiction de l'arithmétique". Journal fur die reine und angewandte Mathematik (166): 1–8.
- Hofstadter, Douglas R. "Chapter XVII: Church, Turing, Tarski, and Others". Gödel, Escher, Bach: an Eternal Golden Braid.
- Kleene, Stephen Cole (1935). "A Theory of Positive Integers in Formal Logic". American Journal of Mathematics (57): 153–173 & 219–244.
- Kleene, Stephen Cole (1936). "Lambda-Definability and Recursiveness". Duke Mathematical Journal (2): 340–353.
- Kleene, Stephen Cole (1943). "Recursive Predicates and Quantifiers". American Mathematical Society Transactions (Transactions of the American Mathematical Society, Vol. 53, No. 1) 54 (1): 41–73. doi:10.2307/1990131. http://jstor.org/stable/1990131. Reprinted in The Undecidable, p. 255ff. Kleene refined his definition of "general recursion" and proceeded in his chapter "12. Algorithmic theories" to posit "Thesis I" (p. 274); he would later repeat this thesis (in Kleene 1952:300) and name it "Church's Thesis" (Kleene 1952:317) (i.e., the Church thesis).
- Kleene, Stephen Cole (1952). Introduction to Metamathematics. North-Holland. OCLC 523942.
- Knuth, Donald (1973). The Art of Computer Programming. 1/Fundamental Algorithms (2nd ed.). Addison–Wesley.
- Kugel, Peter (November 2005). "It's time to think outside the computational box". Communications of the ACM 48 (11).
- Lewis, H.R.; Papadimitriou, C.H. (1998). Elements of the Theory of Computation. Upper Saddle River, NJ, USA: Prentice-Hall.
- Manna, Zohar (1974) [2003]. Mathematical Theory of Computation. Dover. ISBN 9780486432380.
- "The Theory of Algorithms". American Mathematical Society Translations 2 (15): 1–14. 1960.
- Pour-El, M.B.; Richards, J.I. (1989). Computability in Analysis and Physics. Springer Verlag.
- Rosser, J. B. (1939). "An Informal Exposition of Proofs of Godel's Theorem and Church's Theorem". The Journal of Symbolic Logic (The Journal of Symbolic Logic, Vol. 4, No. 2) 4 (2): 53–60. doi:10.2307/2269059. http://jstor.org/stable/2269059.
- Soare, Robert (1996). "Computability and Recursion". Bulletin of Symbolic Logic (2): 284–321.
- Syropoulos, Apostolos (2008). Hypercomputation: Computing Beyond the Church–Turing Barrier. Springer. ISBN 9780387308869.
- Turing, A. M. (1936–37). "On Computable Numbers, with an Application to the Entscheidungsproblem". Proceedings of the London Mathematical Society, series 2, 42: 230–265. doi:10.1112/plms/s2-42.1.230 (and Turing, A. M. (1938). "On Computable Numbers, with an Application to the Entscheidungsproblem: A correction". Proceedings of the London Mathematical Society, series 2, 43: 544–546. doi:10.1112/plms/s2-43.6.544)
External links